Hellinger distance for fuzzy measures
Authors
Abstract
The Hellinger distance is a distance between two additive measures, defined in terms of the Radon-Nikodym derivatives of these measures. Proposed in 1909, it has been used in a large variety of contexts. In this paper we define an analogous distance for fuzzy measures. We discuss it for distorted probabilities and give two examples.
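To make the definition concrete, the sketch below computes the classical Hellinger distance for the additive (finite, discrete) case, where the Radon-Nikodym derivatives reduce to ratios of point masses; this illustrates only the standard notion the paper generalizes, not its fuzzy-measure extension. The function name is our own choice.

```python
import math

def hellinger(p, q):
    """Classical Hellinger distance between two discrete probability
    distributions p and q over the same finite support.

    For finite distributions the general Radon-Nikodym form reduces to
        H(p, q) = (1/sqrt(2)) * sqrt(sum_i (sqrt(p_i) - sqrt(q_i))**2),
    which takes values in [0, 1].
    """
    assert len(p) == len(q), "distributions must share a support"
    s = sum((math.sqrt(a) - math.sqrt(b)) ** 2 for a, b in zip(p, q))
    return math.sqrt(s) / math.sqrt(2)

# Identical distributions are at distance 0; distributions with
# disjoint support attain the maximum value 1.
print(hellinger([0.5, 0.5], [0.5, 0.5]))  # 0.0
print(hellinger([1.0, 0.0], [0.0, 1.0]))  # 1.0
```

The 1/sqrt(2) normalization is one common convention; some authors omit it, in which case the maximum is sqrt(2) instead of 1.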
Similar articles
Comparing fuzzy measures through their Möbius transform
Fuzzy measures and integrals have been used in multiple applications in the area of information fusion. They can be used to aggregate information when information sources are not independent. Fuzzy measures are used to represent our background knowledge on the information sources. In particular, they can be used to model the dependencies between the variables. One of the applications of Choquet...
Information Measures via Copula Functions
In applications of differential geometry to problems of parametric inference, the notion of divergence is often used to measure the separation between two parametric densities. Among them, in this paper, we will examine measures such as Kullback-Leibler information, J-divergence, Hellinger distance, -Divergence, … and so on. Properties and results related to distance between probability d...
New distance and similarity measures for hesitant fuzzy soft sets
The hesitant fuzzy soft set (HFSS), as a combination of hesitant fuzzy and soft sets, is regarded as a useful tool for dealing with the uncertainty and ambiguity of real-world problems. In HFSSs, each element is defined in terms of several parameters with arbitrary membership degrees. In addition, distance and similarity measures are considered important tools in different areas such as ...
Optimal Entropy-Transport problems and a new Hellinger-Kantorovich distance between positive measures
We develop a full theory for the new class of Optimal Entropy-Transport problems between nonnegative and finite Radon measures in general topological spaces. They arise quite naturally by relaxing the marginal constraints typical of Optimal Transport problems: given a pair of finite measures (with possibly different total mass), one looks for minimizers of the sum of a linear transport functi...
Some inequalities for information divergence and related measures of discrimination
Inequalities which connect information divergence with other measures of discrimination or distance between probability distributions are used in information theory and its applications to mathematical statistics, ergodic theory and other scientific fields. We suggest new inequalities of this type, often based on underlying identities. As a consequence we obtain certain improvements of the well...